
    Fuzzy Extractors: How to Generate Strong Keys from Biometrics and Other Noisy Data

    We provide formal definitions and efficient secure techniques for turning noisy information into keys usable for any cryptographic application, and, in particular, for reliably and securely authenticating biometric data. Our techniques apply not just to biometric information, but to any keying material that, unlike traditional cryptographic keys, is (1) not reproducible precisely and (2) not distributed uniformly. We propose two primitives: a "fuzzy extractor" reliably extracts nearly uniform randomness R from its input; the extraction is error-tolerant in the sense that R will be the same even if the input changes, as long as it remains reasonably close to the original. Thus, R can be used as a key in a cryptographic application. A "secure sketch" produces public information about its input w that does not reveal w, and yet allows exact recovery of w given another value that is close to w. Thus, it can be used to reliably reproduce error-prone biometric inputs without incurring the security risk inherent in storing them. We define the primitives to be both formally secure and versatile, generalizing much prior work. In addition, we provide nearly optimal constructions of both primitives for various measures of "closeness" of input data, such as Hamming distance, edit distance, and set difference.
    Comment: 47 pp., 3 figures. Preliminary version in Eurocrypt 2004, Springer LNCS 3027, pp. 523-540. Differences from version 3: minor edits for grammar, clarity, and typos.
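
    As a concrete illustration of the secure-sketch primitive for Hamming distance, here is a minimal Python sketch of the code-offset construction, using a simple repetition code rather than the nearly optimal codes the paper employs; the block size R and the bit-list interface are illustrative assumptions. A full fuzzy extractor would additionally apply a seeded randomness extractor (e.g. a universal hash) to the recovered w to obtain the nearly uniform key R.

```python
import secrets

R = 5  # repetition factor; each block tolerates up to R//2 flipped bits

def _rep_encode(bits):
    # Encode k message bits as k*R codeword bits by repetition.
    return [b for b in bits for _ in range(R)]

def _rep_decode(bits):
    # Majority-vote each block of R bits back to one message bit.
    return [int(sum(bits[i:i + R]) > R // 2) for i in range(0, len(bits), R)]

def sketch(w):
    # Code-offset sketch SS(w) = w XOR c for a random codeword c.
    # w is a list of bits whose length is a multiple of R.
    msg = [secrets.randbelow(2) for _ in range(len(w) // R)]
    return [wi ^ ci for wi, ci in zip(w, _rep_encode(msg))]

def recover(w_noisy, s):
    # Rec(w', s): shift w' by the sketch, decode to the nearest codeword, shift back.
    shifted = [wi ^ si for wi, si in zip(w_noisy, s)]
    c = _rep_encode(_rep_decode(shifted))
    return [si ^ ci for si, ci in zip(s, c)]
```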

    Robust Randomness Amplifiers: Upper and Lower Bounds

    A recent sequence of works, initially motivated by the study of the nonlocal properties of entanglement, demonstrates that a source of information-theoretically certified randomness can be constructed based only on two simple assumptions: the prior existence of a short random seed and the ability to ensure that two black-box devices do not communicate (i.e. are non-signaling). We call protocols achieving such certified amplification of a short random seed randomness amplifiers. We introduce a simple framework in which we initiate the systematic study of the possibilities and limitations of randomness amplifiers. Our main results include a new, improved analysis of a robust randomness amplifier with exponential expansion, as well as the first upper bounds on the maximum expansion achievable by a broad class of randomness amplifiers. In particular, we show that non-adaptive randomness amplifiers that are robust to noise cannot achieve more than doubly exponential expansion. Finally, we show that a wide class of protocols based on the use of the CHSH game can only lead to (singly) exponential expansion if adversarial devices are allowed the full power of non-signaling strategies. Our upper bound results apply to all known non-adaptive randomness amplifier constructions to date.
    Comment: 28 pages. Comments welcome.
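
    Protocols of this kind are typically built around repeatedly playing a nonlocal game such as CHSH with the two boxes and aborting unless the observed win rate exceeds what any local deterministic strategy can achieve. The Python sketch below shows that certification step only; the device interface, round count, and threshold are illustrative assumptions, not the protocols analyzed in the paper.

```python
import random

CLASSICAL_LIMIT = 0.75  # best possible win rate for local deterministic boxes

def chsh_win(x, y, a, b):
    # The CHSH game is won when a XOR b equals x AND y.
    return (a ^ b) == (x & y)

def certify(device, rounds=10_000, threshold=0.80):
    # Draw uniformly random questions from the short seed and accept only if
    # the empirical win rate clears the classical limit by a safety margin.
    wins = 0
    for _ in range(rounds):
        x, y = random.getrandbits(1), random.getrandbits(1)
        a, b = device(x, y)  # answers from the two non-communicating boxes
        wins += chsh_win(x, y, a, b)
    return wins / rounds >= threshold
```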

    Simple extractors via constructions of cryptographic pseudo-random generators

    Trevisan has shown that constructions of pseudo-random generators from hard functions (the Nisan-Wigderson approach) also produce extractors. We show that constructions of pseudo-random generators from one-way permutations (the Blum-Micali-Yao approach) can be used for building extractors as well. Using this new technique we build extractors that do not use designs or polynomial-based error-correcting codes and that are very simple and efficient. For example, one extractor produces each output bit separately in $O(\log^2 n)$ time. These extractors work for weak sources with min-entropy $\lambda n$, for arbitrary constant $\lambda > 0$, have seed length $O(\log^2 n)$, and their output length is $\approx n^{\lambda/3}$.
    Comment: 21 pages, an extended abstract will appear in Proc. ICALP 2005; small corrections, some comments and references added.
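
    For reference, the Blum-Micali-Yao generator that this construction builds on iterates a one-way permutation and outputs a hard-core bit at each step. A minimal Python sketch of that iteration follows (the generator only, not the extractor derived from it; f, b, and the seed are caller-supplied assumptions).

```python
def blum_micali(f, b, seed, m):
    # Output m bits b(x), b(f(x)), b(f(f(x))), ... for a one-way permutation f
    # over the seed domain and a hard-core predicate b returning 0 or 1.
    out, x = [], seed
    for _ in range(m):
        out.append(b(x))
        x = f(x)
    return out
```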

    Secure self-calibrating quantum random bit generator

    Random bit generators (RBGs) are key components of a variety of information processing applications ranging from simulations to cryptography. In particular, cryptographic systems require "strong" RBGs that produce high-entropy bit sequences, but traditional software pseudo-RBGs have very low entropy content and therefore are relatively weak for cryptography. Hardware RBGs yield entropy from chaotic or quantum physical systems and therefore are expected to exhibit high entropy, but in current implementations their exact entropy content is unknown. Here we report a quantum random bit generator (QRBG) that harvests entropy by measuring single-photon and entangled two-photon polarization states. We introduce and implement a quantum tomographic method to measure a lower bound on the "min-entropy" of the system, and we employ this value to distill a truly random bit sequence. This approach is secure: even if an attacker takes control of the source of optical states, a secure random sequence can be distilled.
    Comment: 5 pages, 2 figures.
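
    The distillation step hinges on the min-entropy bound: if tomography certifies that no single raw outcome has probability above p_max, each raw bit carries at least -log2(p_max) bits of min-entropy, and a seeded extractor can distill roughly that fraction of nearly uniform bits. A back-of-the-envelope Python sketch, assuming a leftover-hash-lemma-style extractor with an illustrative 2*log2(1/eps) security loss (the paper's exact accounting may differ):

```python
import math

def min_entropy_per_bit(p_max):
    # Min-entropy of one raw output bit given a bound p_max on the most
    # likely outcome (p_max = 0.5 would mean a perfect bit).
    return -math.log2(p_max)

def distillable_bits(n_raw, p_max, eps=1e-6):
    # Rough budget: n * H_min minus a security loss of 2*log2(1/eps).
    return max(0, int(n_raw * min_entropy_per_bit(p_max) - 2 * math.log2(1 / eps)))

# Example: 1e6 raw bits with p_max = 0.6 yield on the order of 7e5 output bits.
print(distillable_bits(1_000_000, 0.6))
```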

    A PCP Characterization of AM

    We introduce a 2-round stochastic constraint-satisfaction problem, and show that its approximation version is complete for (the promise version of) the complexity class AM. This gives a 'PCP characterization' of AM analogous to the PCP Theorem for NP. Similar characterizations have been given for higher levels of the Polynomial Hierarchy, and for PSPACE; however, we suggest that the result for AM might be of particular significance for attempts to derandomize this class. To test this notion, we pose some 'Randomized Optimization Hypotheses' related to our stochastic CSPs that (in light of our result) would imply collapse results for AM. Unfortunately, the hypotheses appear over-strong, and we present evidence against them. In the process we show that, if some language in NP is hard-on-average against circuits of size $2^{\Omega(n)}$, then there exist hard-on-average optimization problems of a particularly elegant form. All our proofs use a powerful form of PCPs known as Probabilistically Checkable Proofs of Proximity, and demonstrate their versatility. We also use known results on randomness-efficient soundness- and hardness-amplification. In particular, we make essential use of the Impagliazzo-Wigderson generator; our analysis relies on a recent Chernoff-type theorem for expander walks.
    Comment: 18 pages.
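
    To make the object concrete, the value of a 2-round stochastic CSP instance can be computed by brute force on tiny instances. The Python sketch below assumes the AM-style quantifier order (the verifier's random variables are drawn first, the prover's existential variables are chosen second); the constraint interface and variable counts are illustrative, not the paper's formalization.

```python
from itertools import product
from statistics import mean

def stochastic_csp_value(constraints, n_rand, n_exist):
    # Value = E over random assignments r of max over existential assignments y
    # of the fraction of constraints c(r, y) that are satisfied.
    # constraints: callables taking (r, y) tuples of 0/1 values and returning bool.
    total = 0.0
    for r in product([0, 1], repeat=n_rand):
        total += max(
            mean(int(c(r, y)) for c in constraints)
            for y in product([0, 1], repeat=n_exist)
        )
    return total / 2 ** n_rand
```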

    Security of practical private randomness generation

    Measurements on entangled quantum systems necessarily yield outcomes that are intrinsically unpredictable if they violate a Bell inequality. This property can be used to generate certified randomness in a device-independent way, i.e., without making detailed assumptions about the internal working of the quantum devices used to generate the random numbers. Furthermore, these numbers are also private, i.e., they appear random not only to the user, but also to any adversary that might possess a perfect description of the devices. Since this process requires a small initial random seed, one usually speaks of device-independent randomness expansion. The purpose of this paper is twofold. First, we point out that in most real, practical situations, where the concept of device-independence is used as a protection against unintentional flaws or failures of the quantum apparatuses, it is sufficient to show that the generated string is random with respect to an adversary that holds only classical side information, i.e., proving randomness against quantum side information is not necessary. Furthermore, the initial random seed does not need to be private with respect to the adversary, provided that it is generated in a way that is independent from the measured systems. The devices, though, will generate cryptographically secure randomness that cannot be predicted by the adversary, and thus one can, given access to free public randomness, talk about private randomness generation. The theoretical tools to quantify the generated randomness according to these criteria were already introduced in [S. Pironio et al., Nature 464, 1021 (2010)], but the final results were improperly formulated. The second aim of this paper is to correct this inaccurate formulation and thereby lay out a precise theoretical framework for practical device-independent randomness expansion.
    Comment: 18 pages. v3: important changes: the present version focuses on security against classical side information, and a discussion about the significance of these results has been added. v4: minor changes. v5: small typos corrected.
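
    In this line of work the amount of certified randomness is quantified as a function of the observed Bell (CHSH) violation S. The Python sketch below uses a bound of the form 1 - log2(1 + sqrt(2 - S^2/4)) for the min-entropy per round, which is the shape of bound reported in the Pironio et al. reference; treat the exact expression as illustrative rather than as this paper's corrected statement.

```python
import math

def min_entropy_rate(S):
    # Certified min-entropy per measurement round as a function of the
    # observed CHSH value S, with 2 < S <= 2*sqrt(2).
    if S <= 2:
        return 0.0  # no Bell violation, nothing is certified
    return 1 - math.log2(1 + math.sqrt(2 - S * S / 4))

# Example: the maximal quantum violation S = 2*sqrt(2) certifies 1 bit per round.
print(min_entropy_rate(2 * math.sqrt(2)))  # 1.0
```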

    From Low-Distortion Norm Embeddings to Explicit Uncertainty Relations and Efficient Information Locking

    The existence of quantum uncertainty relations is the essential reason that some classically impossible cryptographic primitives become possible when quantum communication is allowed. One direct operational manifestation of these uncertainty relations is a purely quantum effect referred to as information locking. A locking scheme can be viewed as a cryptographic protocol in which a uniformly random n-bit message is encoded in a quantum system using a classical key of size much smaller than n. Without the key, no measurement of this quantum state can extract more than a negligible amount of information about the message, in which case the message is said to be "locked". Furthermore, knowing the key, it is possible to recover, that is "unlock", the message. In this paper, we make the following contributions by exploiting a connection between uncertainty relations and low-distortion embeddings of L2 into L1. We introduce the notion of metric uncertainty relations and connect it to such embeddings; a metric uncertainty relation also implies an entropic uncertainty relation. We prove that random bases satisfy uncertainty relations with a stronger definition and better parameters than previously known, with a proof that is considerably simpler than earlier ones. We apply this result to show the existence of locking schemes with key size independent of the message length. We give efficient constructions of metric uncertainty relations: the bases defining them are computable by quantum circuits of almost linear size. This leads to the first explicit construction of a strong information locking scheme. Moreover, we present a locking scheme that is close to being implementable with current technology. Finally, we apply our metric uncertainty relations to exhibit communication protocols that perform quantum equality testing.
    Comment: 60 pages, 5 figures. v4: published version.
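
    As a toy picture of a locking scheme's interface, the Python sketch below encodes a basis state with one of K Haar-random bases selected by a short key and decodes it exactly when the key is known. Generic random unitaries are used here purely for illustration; the paper's contribution is an explicit, efficiently computable family with key length independent of the message length.

```python
import numpy as np
from scipy.stats import unitary_group

D, K = 16, 4  # message-space dimension and number of possible keys

BASES = [unitary_group.rvs(D) for _ in range(K)]  # Haar-random bases (toy stand-in)

def lock(m, key):
    # Encode the message m (0 <= m < D) as the m-th vector of the key-th basis.
    return BASES[key] @ np.eye(D)[m]

def unlock(state, key):
    # With the key, measure in the matching basis and recover m exactly.
    return int(np.argmax(np.abs(BASES[key].conj().T @ state)))

m = 7
assert unlock(lock(m, key=2), key=2) == m
```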

    Prospects for the Development of an Export-Oriented Strategy of Enterprises

    The article considers the strategic development of enterprises' export-oriented policy and outlines the prospects for the development of Ukraine's international trade relations.

    Constructive Dimension and Turing Degrees

    This paper examines the constructive Hausdorff and packing dimensions of Turing degrees. The main result is that every infinite sequence S with constructive Hausdorff dimension dim_H(S) and constructive packing dimension dim_P(S) is Turing equivalent to a sequence R with dim_H(R) >= (dim_H(S) / dim_P(S)) - epsilon, for arbitrary epsilon > 0. Furthermore, if dim_P(S) > 0, then dim_P(R) >= 1 - epsilon. The reduction thus serves as a *randomness extractor* that increases the algorithmic randomness of S, as measured by constructive dimension. A number of applications of this result shed new light on the constructive dimensions of Turing degrees. A lower bound of dim_H(S) / dim_P(S) is shown to hold for the constructive Hausdorff dimension of the Turing degree of any sequence S. A new proof is given of a previously known zero-one law for the constructive packing dimension of Turing degrees. It is also shown that, for any regular sequence S (that is, dim_H(S) = dim_P(S)) such that dim_H(S) > 0, the Turing degree of S has constructive Hausdorff and packing dimension equal to 1. Finally, it is shown that no single Turing reduction can be a universal constructive Hausdorff dimension extractor, and that bounded Turing reductions cannot extract constructive Hausdorff dimension. We also exhibit sequences on which weak truth-table and bounded Turing reductions differ in their ability to extract dimension.
    Comment: The version of this paper appearing in Theory of Computing Systems, 45(4):740-755, 2009, had an error in the proof of Theorem 2.4, due to insufficient care with the choice of delta. This version modifies that proof to fix the error.
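
    A worked instance of the main bound may help fix ideas; this is just the statement above evaluated on illustrative numbers, not a result from the paper.

```python
def extraction_lower_bound(dim_H_S, dim_P_S, eps):
    # dim_H(R) >= dim_H(S)/dim_P(S) - eps for the sequence R Turing-equivalent
    # to S that the reduction produces (requires dim_P(S) > 0).
    return dim_H_S / dim_P_S - eps

# Example: dim_H(S) = 0.25, dim_P(S) = 0.5, eps = 0.01 gives dim_H(R) >= 0.49.
print(extraction_lower_bound(0.25, 0.5, 0.01))
```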